Diffusion approximation of frequency sensitive competitive learning
Authors
Abstract
The focus of this paper is a convergence study of the frequency sensitive competitive learning (FSCL) algorithm. We approximate the final phase of FSCL learning by a diffusion process described by the Fokker-Planck equation. Necessary and sufficient conditions are presented for the convergence of the diffusion process to a local equilibrium. The analysis parallels that by Ritter and Schulten (1988) for Kohonen's self-organizing map. We show that the convergence conditions involve only the learning rate and that they are the same as the conditions for weak convergence described previously. Our analysis thus broadens the class of algorithms that have been shown to have these types of convergence characteristics.
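For readers unfamiliar with the algorithm being analyzed, the following minimal Python sketch illustrates the kind of FSCL update the paper studies: a winner is selected under a win-count-weighted distance and moved toward the input with a decaying learning rate. The fairness term (scaling by the raw win count), the learning-rate schedule, and all parameter values are illustrative assumptions, not the exact formulation or settings analyzed in the paper.

```python
import numpy as np

def fscl_train(data, n_codewords=16, n_epochs=20, lr0=0.1):
    """Minimal FSCL sketch: winner selection is biased by each unit's
    win count, so frequently winning units become harder to win with.
    Fairness term and learning-rate schedule are illustrative choices."""
    rng = np.random.default_rng(0)
    # Initialize codewords from random training samples.
    w = data[rng.choice(len(data), n_codewords, replace=False)].astype(float)
    wins = np.ones(n_codewords)          # win counts (frequency sensitivity)
    for epoch in range(n_epochs):
        lr = lr0 / (1 + epoch)           # decaying learning rate (assumed schedule)
        for x in rng.permutation(data):
            # Frequency-sensitive distortion: squared distance scaled by win count.
            d = wins * np.sum((w - x) ** 2, axis=1)
            i = np.argmin(d)             # winner of the competition
            w[i] += lr * (x - w[i])      # move winner toward the input
            wins[i] += 1                 # penalize it in future competitions
    return w, wins
```

The diffusion analysis in the paper concerns the late stage of such a process, when the codewords fluctuate around an equilibrium and the behavior is governed by the learning-rate schedule.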
Related articles
Regularized Autoregressive Multiple Frequency Estimation
The paper addresses the problem of tracking multiple frequencies using Regularized Autoregressive (RAR) approximation. The RAR procedure reduces approximation bias compared to other AR-based frequency detection methods, while still providing competitive variance of the sample estimates. We show that the RAR estimates of multiple periodicities are consistent in probabilit...
Convergence Conditions for Frequency Sensitive Competitive Learning
We present sufficient and necessary conditions for the convergence of the Frequency Sensitive Competitive Learning (FSCL) algorithm to a local equilibrium. The final phase of the FSCL convergence is analyzed by describing the process with a Fokker-Planck equation. The analysis parallels that by Ritter and Schulten for the KSFM algorithm. We show that the convergence conditions involve only the learning ...
Codeword Distribution for Frequency Sensitive Competitive Learning with One Dimensional Input
We study the codeword distribution for a conscience-type competitive learning algorithm, Frequency Sensitive Competitive Learning (FSCL), using one dimensional input data. We prove that the asymptotic codeword density in the limit of a large number of codewords is given by a power law of the form Q(x) = C P(x)^α, where P(x) is the input data density and α depends on the algorithm and the form of t...
High-dimensional clustering using frequency sensitive competitive learning
In this paper a clustering algorithm for sparsely sampled high-dimensional feature spaces is proposed. The algorithm performs clustering by employing a distance measure that compensates for differently sized clusters. A sequential version of the algorithm is constructed in the form of a frequency sensitive competitive learning scheme. Experiments are conducted on an artificial Gaussian data set a...
Image compression using frequency sensitive competitive neural network
Vector quantization is one of the most powerful techniques used for speech and image compression at medium to low bit rates. The Frequency Sensitive Competitive Learning (FSCL) algorithm is particularly effective for adaptive vector quantization in image compression systems. This paper presents a compression scheme for grayscale still images using this FSCL method. In this paper, we have genera...
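As a rough illustration of how an FSCL codebook could be used for block-based image compression of this kind, here is a toy vector-quantization sketch. The `codebook` is assumed to come from a routine like the `fscl_train` sketch above, and the 4x4 block size is an arbitrary choice, not necessarily the one used in the paper.

```python
import numpy as np

def vq_compress(image, codebook, block=4):
    """Toy vector-quantization step: split a grayscale image into
    block x block tiles and replace each tile with the index of its
    nearest codeword. Block size and codebook are illustrative."""
    h, w = image.shape
    h, w = h - h % block, w - w % block      # crop to a multiple of the block size
    tiles = (image[:h, :w]
             .reshape(h // block, block, w // block, block)
             .transpose(0, 2, 1, 3)
             .reshape(-1, block * block))
    # Nearest codeword index per tile (the compressed representation).
    d = ((tiles[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    idx = d.argmin(axis=1)
    # Reconstruction: look up codewords and reassemble the tiles.
    rec = (codebook[idx]
           .reshape(h // block, w // block, block, block)
           .transpose(0, 2, 1, 3)
           .reshape(h, w))
    return idx, rec
```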
Journal: IEEE Transactions on Neural Networks
Volume 8, Issue 5
Pages: -
Publication year: 1997